
feat: configurable touchscreen and touchpad gesture settings #3771

Draft
julianjc84 wants to merge 38 commits into niri-wm:main from julianjc84:feat/configurable-touch-gestures

Conversation

@julianjc84 julianjc84 commented Apr 6, 2026

original post

Builds on #3180 — adds configurable settings for touch gestures.

Changes

  • Per-gesture finger-count, sensitivity, and off toggle (touchpad
    & touchscreen)
  • recognition-threshold for gesture direction locking
  • Touchscreen per-gesture natural-scroll (touchpad inherits from
    device-level libinput)
  • 12 new bind triggers, TouchpadSwipe{3,4,5}{Up,Down,Left,Right}:
    modifier + swipe fires discrete actions; bare swipes remain continuous
    gestures
  • Gesture logic extracted into src/input/touch_gesture.rs
binds {
    Mod+TouchpadSwipe3Right { spawn "rofi" "-show" "drun"; }
    Mod+TouchpadSwipe4Left { focus-workspace-up; }
}

Addresses #3180 and #372 — sensitivity too high, hardcoded gestures, no
way to disable or bind swipe gestures.

Settings UI available separately: niri-touch-settings-UI

Depends on: #3180

Feature-rich demo

Watch the demo

https://youtu.be/eQJSEfhol9c

Configurable touch gestures

Replaces the old hardcoded gesture set with a fully configurable touch
subsystem for both touchpad and touchscreen.

What's implemented

  • Property-form triggers: TouchSwipe, TouchPinch, TouchRotate,
    TouchEdge, and TouchpadSwipe as KDL nodes with properties, e.g.
    TouchSwipe fingers=3 direction="up".
  • Families: swipe, pinch, rotate, and edge swipes for touchscreen;
    swipe for touchpad.
  • 3–10 finger binds per family.
  • Edge zones — each screen edge split into thirds, so a bind can
    target e.g. edge="right" zone="top" independently of the rest of
    the edge.
  • Rotation recognition with its own trigger angle and dominance
    ratio, so rotate can win against swipe/pinch cleanly.
  • Tag IPC — any bind can carry tag="...", and niri publishes
    begin/progress/end events for that tag. External clients subscribe
    without ever touching raw input (see companion repos below).
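
The rotation and pinch "dominance ratio" idea above can be sketched roughly like this. This is a minimal illustration, not the branch's actual code: names, thresholds, and the exact comparison are assumptions; in particular, comparing degrees against pixels through a ratio is a simplification.

```rust
// Illustrative dominance-ratio classification: a family wins only when it
// exceeds its own trigger threshold AND beats swipe motion by its ratio.

#[derive(Debug, PartialEq, Clone, Copy)]
pub enum Family {
    Swipe,
    Pinch,
    Rotate,
    Undecided,
}

pub struct Scores {
    pub swipe_px: f64,   // centroid travel, pixels
    pub pinch_px: f64,   // spread change, pixels
    pub rotate_deg: f64, // accumulated rotation, degrees
}

/// Decide the gesture family. `pinch_ratio` / `rotate_ratio` are the
/// dominance ratios: how strictly each family must beat swipe to win.
pub fn classify(
    s: &Scores,
    swipe_threshold: f64,  // e.g. recognition-threshold 26.0
    pinch_threshold: f64,  // e.g. pinch-threshold 20.0
    pinch_ratio: f64,      // e.g. pinch-ratio 2.0
    rotate_threshold: f64, // hypothetical trigger angle
    rotate_ratio: f64,     // hypothetical dominance ratio
) -> Family {
    if s.rotate_deg.abs() >= rotate_threshold
        && s.rotate_deg.abs() >= rotate_ratio * s.swipe_px
    {
        Family::Rotate
    } else if s.pinch_px.abs() >= pinch_threshold
        && s.pinch_px.abs() >= pinch_ratio * s.swipe_px
    {
        Family::Pinch
    } else if s.swipe_px >= swipe_threshold {
        Family::Swipe
    } else {
        Family::Undecided
    }
}
```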

What users can change

Per bind:

  • Finger count (3–10)
  • Direction (up/down/left/right, in/out, cw/ccw)
  • Edge + zone (for TouchEdge)
  • Any niri action as the body — or noop with a tag for pure IPC
  • sensitivity, natural-scroll, tag

Global tuning (in input { touchscreen { gestures { … } } } and the
touchpad equivalent):

  • Swipe / pinch / rotate trigger distances and angles
  • Pinch and rotate dominance ratios (how strictly each family has to
    beat the others to win)
  • Progress distances / angles (how recognition maps to IPC progress)
  • Edge start-zone width
binds {
    TouchSwipe fingers=3 direction="up"  tag="ws-nav" { focus-workspace-up; }
    TouchEdge  edge="right" zone="top"   tag="zone-right-top" { noop; }
    Mod+TouchpadSwipe fingers=4 direction="left" { focus-workspace-up; }
}
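
The edge + zone mapping described earlier (each edge split into thirds) could look something like the sketch below. All names and the start-zone width parameter are illustrative, and the generic Start/Middle/End zone naming stands in for the config's top/bottom style values.

```rust
// Illustrative edge-zone resolution: a touch point within `start_width`
// pixels of a screen edge resolves to (edge, zone), where the zone is
// which third of that edge's length the touch landed in.

#[derive(Debug, PartialEq)]
pub enum Edge { Left, Right, Top, Bottom }

#[derive(Debug, PartialEq)]
pub enum Zone { Start, Middle, End } // e.g. top/middle/bottom for a vertical edge

pub fn edge_zone(x: f64, y: f64, w: f64, h: f64, start_width: f64) -> Option<(Edge, Zone)> {
    // Which third of a length `len` does position `pos` fall into?
    let third = |pos: f64, len: f64| {
        if pos < len / 3.0 { Zone::Start }
        else if pos < 2.0 * len / 3.0 { Zone::Middle }
        else { Zone::End }
    };
    if x < start_width {
        Some((Edge::Left, third(y, h)))
    } else if x > w - start_width {
        Some((Edge::Right, third(y, h)))
    } else if y < start_width {
        Some((Edge::Top, third(x, w)))
    } else if y > h - start_width {
        Some((Edge::Bottom, third(x, w)))
    } else {
        None
    }
}
```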

Companion tools

@Sempyos Sempyos added the area:input (Keyboard, mouse, touchpad, tablet, gestures, pointer), area:config (Config parsing, default config, new settings), and kind:feature (New features and functionality) labels Apr 6, 2026
Comment thread niri-config/src/input.rs Outdated
self.gestures
.as_ref()
.and_then(|g| g.workspace_switch.as_ref())
.map_or(self.natural_scroll, |a| a.natural_scroll || self.natural_scroll)
Contributor


You're using ||, so will per-gesture overrides for this still work?

Comment thread niri-config/src/input.rs Outdated
self.gestures
.as_ref()
.and_then(|g| g.view_scroll.as_ref())
.map_or(self.natural_scroll, |a| a.natural_scroll || self.natural_scroll)
Contributor


Same question as previous.

Comment thread niri-config/src/input.rs Outdated
self.gestures
.as_ref()
.and_then(|g| g.overview_toggle.as_ref())
.map_or(self.natural_scroll, |a| a.natural_scroll || self.natural_scroll)
Contributor


Here too.

Comment thread niri-config/src/input.rs
impl ScrollFactor {
pub fn h_v_factors(&self) -> (f64, f64) {
let base_value = self.base.map(|f| f.0).unwrap_or(1.0);
let base_value = self.base.map(|f| f.0).unwrap_or(0.4);
Contributor


I assume you changed this for the touch-screen gestures, but don't the touch_pad_ gestures also use this? How will they be affected?

Author


Yes, correct. Last night I separated out the touchpad logic that was hardcoded in.
So now touch (renamed touchscreen) and touchpad are separated and fully independent.

There is still the issue with natural-scroll, which is why I'm looking at a Lua config implementation.

Contributor

@Atan-D-RP4 Atan-D-RP4 Apr 6, 2026


Yes, correct. Last night I separated out the touchpad logic that was hardcoded in. So now touch (renamed touchscreen) and touchpad are separated and fully independent.

Got it.

The Lua config idea is overkill for most of the config stuff exposed by Niri, the natural-scroll case here included. My implementation is, for all its seeming complexity, only able to expose the config that KDL already does well, plus the events/commands on the IPC, with some async handlers.

@Atan-D-RP4
Contributor

Is it possible to also do screen-edge gestures? Shouldn't be too complicated, I would think.

@julianjc84
Author

  1. Removed per-gesture natural-scroll for touchpad — touchpad inherits from
    device-level libinput, touchscreen keeps it
  2. Fixed finger count bug — 3-finger swipe could trigger a 4-finger gesture
    (the || bug Atan flagged)
  3. Added 12 discrete swipe bind triggers —
    Mod+TouchpadSwipe{3,4,5}{Up,Down,Left,Right}
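
The `||` issue Atan flagged in the `natural-scroll` snippets above can be shown with a minimal sketch. The types here are illustrative, not the PR's actual config structs, but the logic mirrors the reviewed `map_or(…, |a| a.natural_scroll || self.natural_scroll)` expression:

```rust
// Why `a.natural_scroll || self.natural_scroll` breaks overrides: once the
// device-level flag is true, `||` makes it impossible for a per-gesture
// setting to opt out. Modelling the override as Option<bool> fixes that.

pub struct Device {
    pub natural_scroll: bool,       // device-level libinput setting
    pub gesture_override: Option<bool>, // None = inherit device setting
}

/// Buggy variant: an explicit `false` override is silently ignored.
pub fn effective_or(d: &Device) -> bool {
    d.gesture_override.map_or(d.natural_scroll, |o| o || d.natural_scroll)
}

/// Fixed variant: an explicit override always wins.
pub fn effective(d: &Device) -> bool {
    d.gesture_override.unwrap_or(d.natural_scroll)
}
```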

@julianjc84
Author

julianjc84 commented Apr 7, 2026

Latest push adds:

  1. Touchscreen edge swipe gestures — single-finger swipe from screen edge
    triggers compositor gestures. Config lives under nested input { touchscreen
    { gestures {} } }:
input {
    touchscreen {
        gestures {
            workspace-switch {
                natural-scroll
                finger-count 3
                sensitivity 0.2
            }
            view-scroll {
                natural-scroll
                finger-count 3
                sensitivity 0.2
            }
            overview-toggle {
                natural-scroll
                finger-count 4
                sensitivity 0.2
            }
            recognition-threshold 26.0
            edge-threshold 30.0
            edge-swipe-left "view-scroll" {
                sensitivity 0.6
            }
            edge-swipe-right "view-scroll" {
                sensitivity 0.6
            }
            edge-swipe-top "workspace-switch" {
                sensitivity 0.5
            }
            edge-swipe-bottom "workspace-switch" {
                sensitivity 0.5
            }
        }
    }
}

Actions: "view-scroll", "workspace-switch", "overview-toggle"

  2. Gesture lock fix — once a multi-finger gesture direction is decided,
    client events stay suppressed until all fingers lift. Prevents mid-gesture
    finger drops from leaking touch events to apps (e.g. 3→2 fingers triggering
    browser pinch-zoom).
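
The gesture-lock behavior can be sketched as a small state machine. State names and the exact forwarding rule are illustrative, not the branch's actual implementation:

```rust
// Illustrative gesture lock: once a direction is decided, touch events stay
// suppressed until every finger lifts, so a mid-gesture finger drop (3 -> 2
// fingers) cannot leak events to the focused app.

#[derive(Debug, PartialEq)]
pub enum LockState {
    Idle,        // no gesture in progress
    Recognizing, // fingers down, direction not yet decided
    Locked,      // direction decided: suppress client events
}

pub struct GestureLock {
    pub state: LockState,
    pub fingers: u32,
}

impl GestureLock {
    pub fn new() -> Self {
        Self { state: LockState::Idle, fingers: 0 }
    }

    pub fn finger_down(&mut self) {
        self.fingers += 1;
        if self.state == LockState::Idle {
            self.state = LockState::Recognizing;
        }
    }

    pub fn direction_decided(&mut self) {
        self.state = LockState::Locked;
    }

    pub fn finger_up(&mut self) {
        self.fingers = self.fingers.saturating_sub(1);
        // Stay Locked while any finger remains; unlock only when all lift.
        if self.fingers == 0 {
            self.state = LockState::Idle;
        }
    }

    /// Should this touch event be forwarded to the focused client?
    pub fn forward_to_client(&self) -> bool {
        self.state == LockState::Idle
    }
}
```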

@julianjc84
Author

https://github.com/julianjc84/niri/tree/feat/touch-lua

Adds an optional Lua scripting layer for touchscreen gesture handling, plus
centroid-based pinch gesture detection.

  • Lua touchscreen control: When ~/.config/niri/touchscreen.lua exists, it
    becomes the exclusive controller for all touchscreen gesture decisions
    (multi-finger swipes, edge swipes, pinch). When absent, KDL config handles
    everything unchanged — zero overhead.
  • Pinch gesture detection: New centroid spread algorithm discriminates
    pinch vs swipe using configurable threshold and ratio. Supports both
    discrete actions (e.g. 5-finger pinch-in → close window) and continuous
    gestures (3-finger pinch → overview toggle).
  • Finger-count awareness: Cumulative normalization (divide by finger count)
    and configurable threshold scaling so 5-finger gestures aren't
    hyper-sensitive compared to 3-finger ones.
  • Unlock-on-new-finger: When additional fingers land after gesture lock,
    recognition resets — enabling reliable 4/5-finger gestures that start as
    3-finger touches.
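
The centroid-spread idea and the finger-count normalization above can be sketched as pure geometry. Function names are assumptions, not this branch's API: spread is the mean distance of fingers from their centroid, so a shrinking spread suggests pinch-in, a growing one pinch-out, and a moving centroid with stable spread suggests a swipe.

```rust
// Illustrative centroid-spread maths for pinch detection.

pub fn centroid(points: &[(f64, f64)]) -> (f64, f64) {
    let n = points.len() as f64;
    let (sx, sy) = points
        .iter()
        .fold((0.0, 0.0), |(ax, ay), (x, y)| (ax + x, ay + y));
    (sx / n, sy / n)
}

/// Mean distance of each finger from the centroid.
pub fn spread(points: &[(f64, f64)]) -> f64 {
    let (cx, cy) = centroid(points);
    let n = points.len() as f64;
    points
        .iter()
        .map(|(x, y)| ((x - cx).powi(2) + (y - cy).powi(2)).sqrt())
        .sum::<f64>()
        / n
}

/// Normalize an accumulated delta by finger count so a 5-finger gesture is
/// not hyper-sensitive compared to a 3-finger one.
pub fn normalized_delta(total_delta: f64, fingers: usize) -> f64 {
    total_delta / fingers.max(1) as f64
}
```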

Why Lua?

Wayland compositors are deliberately restrictive about input access —
unlike X11, third-party apps cannot intercept or observe raw touch events.
Touch input is sandboxed to the focused surface, which means gesture
recognition must live inside the compositor. There's no way for an external
gesture daemon to tap into multi-finger touchscreen events the way tools
like libinput-gestures or fusuma work for touchpads (via /dev/input).

KDL is static — it maps finger counts to fixed actions but can't express
conditional behavior. Lua embedding solves both problems by giving users
extensibility inside the compositor:

  • Different actions per finger count (3-finger swipe vs 5-finger swipe)
  • Context-aware gestures (check if window is fullscreen, per-app overrides)
  • Pinch vs swipe discrimination with custom logic
  • Edge swipe actions that vary by edge direction
  • All without forking the compositor or adding new KDL syntax for every use
    case

Lua callbacks return simple values (false = fall through, true = handled,
"workspace_switch" = override to continuous gesture), keeping the
integration minimal.

@Atan-D-RP4
Contributor

Atan-D-RP4 commented Apr 8, 2026

That's pretty cool. Is it possible to use the gesture state to create animated gestures that the compositor doesn't provide itself?
Something like letting the gesture progress state drive animations from quickshell widgets and such? I've wanted something like that for any implementation of configurable gestures.

Though even for that, embedding Lua for this is simply not in line with the design of the main project. Maybe exposing that state, which you now do through the Lua API/callbacks, can be done via IPC instead.

YaLTeR did have some intent and idea for expanding the IPC surface. This needs a lot of discussion and serious design work.

@julianjc84
Author

Thanks for the feedback — took it on board.

Dropped the Lua approach entirely and went with a KDL bind system instead. I don't know what I was thinking.
Touchscreen gestures now use the same pattern as keyboard binds:

input {
    touchscreen {
        gestures {
            // Detection tuning
            recognition-threshold 26.0
            edge-threshold 30.0
            pinch-threshold 20.0
            pinch-ratio 2.0
            pinch-sensitivity 1.0
            finger-threshold-scale 2.6

            touch-binds {
                // 3-finger swipes — workspace switch (vertical) + view scroll (horizontal)
                Touch3SwipeUp    natural-scroll=true { focus-workspace-up; }
                Touch3SwipeDown  natural-scroll=true { focus-workspace-down; }
                Touch3SwipeLeft  natural-scroll=true { focus-column-right; }
                Touch3SwipeRight natural-scroll=true { focus-column-left; }

                // 4-finger — overview toggle
                Touch4SwipeUp   { toggle-overview; }
                Touch4SwipeDown { toggle-overview; }

                // 5-finger — discrete actions
                Touch5SwipeDown { close-window; }
                Touch5SwipeUp   { screenshot; }

                // Pinch gestures (use discrete open/close, not continuous toggle)
                Touch3PinchIn  { open-overview; }
                Touch3PinchOut { close-overview; }
                Touch4PinchIn  { open-overview; }
                Touch4PinchOut { close-overview; }
                Touch5PinchIn  { close-window; }
                Touch5PinchOut { screenshot; }

                // Edge swipes
                TouchEdgeLeft   sensitivity=0.6 natural-scroll=true { focus-column-right; }
                TouchEdgeRight  sensitivity=0.6 natural-scroll=true { focus-column-left; }
                TouchEdgeTop    sensitivity=0.5 natural-scroll=true { focus-workspace-up; }
                TouchEdgeBottom sensitivity=0.5 natural-scroll=true { focus-workspace-down; }
            }
        }
    }
}

The compositor infers continuous vs discrete from the action: focus-workspace-up, focus-column-right, and toggle-overview drive animations that track your finger, while everything else (like close-window, screenshot) fires once at the threshold. If unsure, just bind the action you want; the compositor knows which ones animate and which ones fire.
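
A minimal sketch of that inference, with the action list taken from the examples in this thread (any real implementation would key off niri's action enum rather than strings):

```rust
// Illustrative continuous-vs-discrete inference: a few actions drive
// finger-tracked animations; everything else fires once at the threshold.

pub fn is_continuous(action: &str) -> bool {
    matches!(
        action,
        "focus-workspace-up"
            | "focus-workspace-down"
            | "focus-column-left"
            | "focus-column-right"
            | "toggle-overview"
    )
}
```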

Still interested in the IPC gesture progress idea for driving external widget animations (QuickShell etc). The bind system maps out what gesture state is useful to expose (finger count, direction, deltas, pinch spread, continuous progress), which could inform an IPC event surface design down the line.

Open to feedback on the approach.

@Atan-D-RP4
Contributor

This is much better!

With the way that's implemented, you can likely move the touch-binds child nodes into binds {}, drop the touch-binds wrapper, and leave the rest of the config knobs in gestures {}, with per-gesture configs still set inline, e.g.:
TouchEdgeLeft sensitivity=0.6 natural-scroll=true { focus-column-right; }

Still interested in the IPC gesture progress idea for driving external widget animations (QuickShell etc). The bind system maps out what gesture state is useful to expose (finger count, direction, deltas, pinch spread, continuous progress), which could inform an IPC event surface design down the line.

The idea, well mine at least, is that we have new events such as GestureBegin, GestureProgress, and GestureEnd, with associated context data (such as maybe some tags?). Then external processes can listen on the event stream for those events, match on those tags (which we would set in the compositor config), and use the event data to drive their custom animations.

There's probably a better spin on this idea, though. But it seems as good a starting point as any.

@julianjc84
Author

That is next level genius.

Moving touch-binds into binds {} makes sense and it unlocks modifier combos for free.
Mod+Touch3SwipeUp { move-window-to-workspace-up; }
would just work since the bind parser already handles modifier prefixes.

The same approach should apply to touchpad gestures too, don't you think?
Touchpad3SwipeUp as a trigger in binds {}, with the same modifier support so
Mod+Touchpad3SwipeDown is a natural gesture.

That would give us one unified bind system: keyboard, touchscreen, and touchpad all sharing the same binds {} block, same action set, same modifier combos.

@Sempyos
Member

Sempyos commented Apr 8, 2026

Perhaps it would be better to have fingers=NUMBER instead of having it in the name of the bind

@julianjc84
Author

Moved touch gestures into the main binds {} block — touchscreen and touchpad now share the same bind system as keyboard and mouse. This gives us modifier combos.

binds {
    // Touchscreen gestures
    Touch3SwipeUp    natural-scroll=true { focus-workspace-up; }
    Touch4SwipeUp   { toggle-overview; }
    Touch3PinchIn   { open-overview; }
    TouchEdgeLeft   sensitivity=0.6 natural-scroll=true { focus-column-right; }

    // Modifier + touchscreen (hold Super + swipe)
    Mod+Touch3SwipeUp { move-window-to-workspace-up; }

    // Touchpad gestures (same system)
    TouchpadSwipe3Up { focus-workspace-up; }

    // Modifier + touchpad
    Mod+TouchpadSwipe3Up { move-window-to-workspace-up; }
}

Detection thresholds stay in input { touchscreen/touchpad { gestures {} } } — they tune the recognition engine, not the actions.

Per-bind properties:

  • sensitivity=0.6 — speed multiplier for continuous gestures
  • natural-scroll=true — invert direction (touchscreen only, touchpad inherits from device setting)
Full test config (click to expand)
binds {
    // Touchscreen gestures
    Touch3SwipeUp    natural-scroll=true { focus-workspace-up; }
    Touch3SwipeDown  natural-scroll=true { focus-workspace-down; }
    Touch3SwipeLeft  natural-scroll=true { focus-column-right; }
    Touch3SwipeRight natural-scroll=true { focus-column-left; }
    Touch4SwipeUp   { toggle-overview; }
    Touch4SwipeDown { toggle-overview; }
    Touch5SwipeDown { close-window; }
    Touch5SwipeUp   { screenshot; }
    Touch3PinchIn  { open-overview; }
    Touch3PinchOut { close-overview; }
    Touch4PinchIn  { open-overview; }
    Touch4PinchOut { close-overview; }
    Touch5PinchIn  { close-window; }
    Touch5PinchOut { screenshot; }

    // Touchscreen Edge gestures
    TouchEdgeLeft   sensitivity=0.6 natural-scroll=true { focus-column-right; }
    TouchEdgeRight  sensitivity=0.6 natural-scroll=true { focus-column-left; }
    TouchEdgeTop    sensitivity=0.5 natural-scroll=true { focus-workspace-up; }
    TouchEdgeBottom sensitivity=0.5 natural-scroll=true { focus-workspace-down; }

    // Touchpad gestures
    TouchpadSwipe3Up    { focus-workspace-up; }
    TouchpadSwipe3Down  { focus-workspace-down; }
    TouchpadSwipe3Left  { focus-column-right; }
    TouchpadSwipe3Right { focus-column-left; }
    TouchpadSwipe4Up    { toggle-overview; }
    TouchpadSwipe4Down  { toggle-overview; }
}
input {
    touchscreen {
        gestures {
            recognition-threshold 26.0
            edge-threshold 30.0
            pinch-threshold 20.0
            pinch-ratio 2.0
            pinch-sensitivity 1.0
            finger-threshold-scale 2.6
        }
    }
    touchpad {
        tap
        dwt
        natural-scroll
        accel-speed 0.2
        accel-profile "adaptive"
        scroll-factor 0.9
        gestures {
            recognition-threshold 16.0
        }
    }
}

Tags are next...

@julianjc84
Author

Perhaps it would be better to have fingers=NUMBER instead of having it in the name of the bind

Good point — fingers=N is cleaner syntax. The tradeoff I had to make with moving into binds {} is that the trigger name needs to be the complete match key (trigger + modifiers = unique bind). With fingers as a property, two binds could share the same trigger name but differ only by finger count, which the current bind lookup can't distinguish.
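
The lookup constraint can be sketched with a plain map. These types are illustrative, not niri's real bind table, but they show why a fingers=N property would collide if the finger count left the trigger name:

```rust
// Illustrative bind table keyed on (modifiers, trigger name). With the
// finger count encoded in the name, Touch3SwipeUp and Touch4SwipeUp are
// distinct keys; as a fingers=N property they would both be "TouchSwipeUp".

use std::collections::HashMap;

#[derive(Hash, PartialEq, Eq)]
pub struct BindKey {
    pub modifiers: u8,   // bitmask, e.g. 0b001 = Mod
    pub trigger: String, // "Touch3SwipeUp": finger count is part of the name
}

/// Returns false when the key already existed, i.e. the collision that a
/// fingers=N property would cause for two "TouchSwipeUp" binds.
pub fn insert_bind(
    map: &mut HashMap<BindKey, String>,
    modifiers: u8,
    trigger: &str,
    action: &str,
) -> bool {
    map.insert(
        BindKey { modifiers, trigger: trigger.to_string() },
        action.to_string(),
    )
    .is_none()
}
```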

@julianjc84
Author

IPC Gesture Events — Implementation & Findings

We've implemented tagged gesture events over IPC, building on the binds {} unification. Gesture binds (touchscreen and touchpad) can carry a tag="name" property, which emits GestureBegin, GestureProgress, and GestureEnd events on the IPC event stream.

Three modes:

// Observe — niri runs the action, IPC mirrors live progress
Touch3SwipeUp tag="ws-nav" { focus-workspace-up; }

// IPC-only — gesture emits events, niri does nothing
Touch4SwipeUp tag="app-drawer" { noop; }

// Plain — no tag, no IPC events, just the action
Touch5SwipeDown { close-window; }

What the event stream looks like:

GestureBegin:    tag=ws-nav trigger=Touch3SwipeUp fingers=3 continuous=true
GestureProgress: tag=ws-nav progress=0.231 delta=(0.0,0.2) t=654070
GestureProgress: tag=ws-nav progress=0.228 delta=(-0.3,0.4) t=654097
GestureEnd:      tag=ws-nav completed=true
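
A subscriber only needs the tag and progress out of each line. The text format above is just this PR's debug rendering (the real IPC stream would presumably be JSON like the rest of niri's event stream), so treat the parsing below as illustrative:

```rust
// Illustrative parser for the debug-rendered GestureProgress lines shown
// above, extracting (tag, progress) for an external animation driver.

/// Parse e.g.
/// "GestureProgress: tag=ws-nav progress=0.231 delta=(0.0,0.2) t=654070".
pub fn parse_progress(line: &str) -> Option<(String, f64)> {
    let rest = line.strip_prefix("GestureProgress:")?.trim();
    let mut tag = None;
    let mut progress = None;
    for field in rest.split_whitespace() {
        if let Some(v) = field.strip_prefix("tag=") {
            tag = Some(v.to_string());
        } else if let Some(v) = field.strip_prefix("progress=") {
            progress = v.parse::<f64>().ok();
        }
    }
    Some((tag?, progress?))
}
```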

Pros:

  • External tools (QuickShell, eww, shell scripts) can drive custom animations synced to finger movement — no compositor changes needed per use case
  • Tags are opt-in — no tag means no IPC events, so the event stream isn't flooded
  • noop action lets gestures be claimed purely for external tools without any niri behavior

Security consideration:

  • Tags are scoped to gesture triggers only (touchscreen and touchpad). Keyboard/mouse binds do not emit IPC events, avoiding exposure of input patterns over the event stream.

Cons / Open questions:

  • noop currently fires as discrete (begin + end, no progress). Progress data is only available when niri is driving an animation (e.g. workspace switch). Continuous noop with raw delta streaming is a possible future addition.
  • Mod+touch on touchscreens has a double-duty conflict — touch is both click and gesture input, so Mod+Touch3SwipeUp can interfere with Mod+click window move/resize when fingers land before the gesture threshold.

Atan-D's original idea of tags for matching gesture events maps well. The tag on a bind is the identifier that external processes filter on. Combined with noop, it makes niri's gesture system extensible without a scripting runtime.

@julianjc84 julianjc84 changed the title feat: configurable touch gesture settings feat: configurable touchscreen and touchpad gesture settings Apr 9, 2026
@julianjc84
Author

niri-tag-sidebar: Gesture Tag Proof of Concept

A GTK4 app that creates slide-out drawer panels driven by niri's IPC gesture events. Swipe from a screen edge, the panel follows your finger.

Repo: https://github.com/julianjc84/niri-tag-sidebar

Gesture binds with a tag property emit GestureBegin / GestureProgress / GestureEnd on niri's IPC event stream. External apps subscribe and react.

binds {
    TouchEdgeLeft  tag="sidebar-left"  { noop; }
    TouchEdgeRight tag="sidebar-right" { noop; }
}

noop makes the gesture continuous (emits progress events) without driving any compositor animation — the IPC stream is the sole output.

The sidebar maps progress (0→1) to panel reveal by adjusting layer-shell margins. On release, it snaps open or closed based on a threshold.
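
The panel maths reduces to two small functions. Names are illustrative, not niri-tag-sidebar's actual API: a hidden panel sits at margin = -size, progress slides it toward 0, and snap_threshold (from the TOML config below) decides open vs closed on release.

```rust
// Illustrative sidebar maths: progress (0..1) -> layer-shell margin, and a
// snap decision on finger release.

/// A fully hidden panel sits at margin = -size; progress 1.0 brings it to 0.
pub fn reveal_margin(progress: f64, size: i32) -> i32 {
    let p = progress.clamp(0.0, 1.0);
    (-(size as f64) * (1.0 - p)).round() as i32
}

/// On release, snap fully open when the last progress cleared the threshold.
pub fn snaps_open(progress: f64, snap_threshold: f64) -> bool {
    progress.clamp(0.0, 1.0) >= snap_threshold
}
```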

Tags also work on touchpad gestures and compositor-animated actions:

binds {
    TouchpadSwipe3Up  tag="ws-nav" { focus-workspace-up; }
    Touch3SwipeDown   tag="ws-nav" { focus-workspace-down; }
}

Known Issues

Progress mismatch on compositor gestures — When a tagged gesture also drives a compositor animation (workspace switch, etc), niri uses its own internal thresholds to decide when to commit. IPC progress is independent and may not match. For noop gestures this isn't an issue — progress is the sole output.

Touchscreen vs touchpad scale — Touchscreen deltas are screen pixels (tracks close to niri's internals). Touchpad deltas are libinput acceleration-adjusted units (nonlinear, harder to tune). Both have configurable gesture-progress-distance:

input {
    touchscreen { gestures { gesture-progress-distance 200.0 } }  // screen pixels
    touchpad    { gestures { gesture-progress-distance 40.0 } }   // libinput units
}
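
The mapping implied by gesture-progress-distance is presumably a clamped ratio along these lines (a sketch, not the branch's actual code):

```rust
// Illustrative IPC progress mapping: accumulated motion divided by the
// configured gesture-progress-distance, clamped to 0..1. Units differ per
// device (screen pixels for touchscreen, libinput units for touchpad).

pub fn ipc_progress(cumulative_delta: f64, progress_distance: f64) -> f64 {
    if progress_distance <= 0.0 {
        return 0.0; // guard against a degenerate config value
    }
    (cumulative_delta.abs() / progress_distance).clamp(0.0, 1.0)
}
```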

GestureEnd completed — Currently hardcoded true. Does not indicate whether niri actually committed the bound action.

Configs

touchscreen-gestures.kdl
input {
    touchscreen {
        gestures {
            // Detection tuning
            recognition-threshold 26.0
            edge-threshold 30.0
            pinch-threshold 20.0
            pinch-ratio 2.0
            pinch-sensitivity 1.0
            finger-threshold-scale 2.6
            gesture-progress-distance 200.0  // screen pixels for IPC progress 0→1
        }
    }
}

binds {
    // Touchscreen gestures
    Touch3SwipeUp    natural-scroll=true tag="ws-nav" { focus-workspace-up; }
    Touch3SwipeDown  natural-scroll=true tag="ws-nav" { focus-workspace-down; }
    Touch3SwipeLeft  natural-scroll=true tag="col-nav" { focus-column-right; }
    Touch3SwipeRight natural-scroll=true tag="col-nav" { focus-column-left; }
    Touch4SwipeUp   { toggle-overview; }
    Touch4SwipeDown { toggle-overview; }
    Touch5SwipeDown tag="test-discrete" { close-window; }
    Touch5SwipeUp   tag="test-discrete" { screenshot; }
    Touch3PinchIn  { open-overview; }
    Touch3PinchOut { close-overview; }
    Touch4PinchIn  { open-overview; }
    Touch4PinchOut { close-overview; }
    Touch5PinchIn  { close-window; }
    Touch5PinchOut { screenshot; }
    
    // TOUCHSCREEN EDGE
    // TouchEdgeLeft   sensitivity=0.6 natural-scroll=true { focus-column-right; }
    // TouchEdgeRight  sensitivity=0.6 natural-scroll=true { focus-column-left; }
    // TouchEdgeTop    sensitivity=0.5 natural-scroll=true { focus-workspace-up; }
    // TouchEdgeBottom sensitivity=0.5 natural-scroll=true { focus-workspace-down; }

    // Edge swipes for niri-tag-sidebar (IPC-only, no compositor action)
    TouchEdgeLeft   tag="sidebar-left"   { noop; }
    TouchEdgeRight  tag="sidebar-right"  { noop; }
    TouchEdgeTop    tag="sidebar-top"    { noop; }
    TouchEdgeBottom tag="sidebar-bottom" { noop; }

    // Mod+Touchscreen gestures (test — hold Super + touch gesture)
    Mod+Touch3SwipeUp    { spawn "notify-send" "Mod+Touch" "3-finger Swipe Up"; }
    Mod+Touch3SwipeDown  { spawn "notify-send" "Mod+Touch" "3-finger Swipe Down"; }
    Mod+Touch3SwipeLeft  { spawn "notify-send" "Mod+Touch" "3-finger Swipe Left"; }
    Mod+Touch3SwipeRight { spawn "notify-send" "Mod+Touch" "3-finger Swipe Right"; }
    Mod+Touch3PinchIn    { spawn "notify-send" "Mod+Touch" "3-finger Pinch In"; }
    Mod+Touch3PinchOut   { spawn "notify-send" "Mod+Touch" "3-finger Pinch Out"; }
    Mod+Touch4SwipeUp    { spawn "notify-send" "Mod+Touch" "4-finger Swipe Up"; }
    Mod+Touch4SwipeDown  { spawn "notify-send" "Mod+Touch" "4-finger Swipe Down"; }
    Mod+Touch4SwipeLeft  { spawn "notify-send" "Mod+Touch" "4-finger Swipe Left"; }
    Mod+Touch4SwipeRight { spawn "notify-send" "Mod+Touch" "4-finger Swipe Right"; }
    Mod+Touch4PinchIn    { spawn "notify-send" "Mod+Touch" "4-finger Pinch In"; }
    Mod+Touch4PinchOut   { spawn "notify-send" "Mod+Touch" "4-finger Pinch Out"; }
    Mod+Touch5SwipeUp    { spawn "notify-send" "Mod+Touch" "5-finger Swipe Up"; }
    Mod+Touch5SwipeDown  { spawn "notify-send" "Mod+Touch" "5-finger Swipe Down"; }
    Mod+Touch5SwipeLeft  { spawn "notify-send" "Mod+Touch" "5-finger Swipe Left"; }
    Mod+Touch5SwipeRight { spawn "notify-send" "Mod+Touch" "5-finger Swipe Right"; }
    Mod+Touch5PinchIn    { spawn "notify-send" "Mod+Touch" "5-finger Pinch In"; }
    Mod+Touch5PinchOut   { spawn "notify-send" "Mod+Touch" "5-finger Pinch Out"; }
    Mod+TouchEdgeLeft   { spawn "notify-send" "Mod+Edge" "Left"; }
    Mod+TouchEdgeRight  { spawn "notify-send" "Mod+Edge" "Right"; }
    Mod+TouchEdgeTop    { spawn "notify-send" "Mod+Edge" "Top"; }
    Mod+TouchEdgeBottom { spawn "notify-send" "Mod+Edge" "Bottom"; }
}
touchpad-gestures.kdl
input {
    touchpad {
        tap
        dwt
        natural-scroll
        accel-speed 0.2
        accel-profile "adaptive"
        scroll-factor 0.9
        gestures {
            recognition-threshold 16.0
            gesture-progress-distance 250.0   // libinput delta units for IPC progress 0→1
        }
    }
}

binds {
    // Touchpad gestures (plain — no modifier)
    TouchpadSwipe3Up    tag="ws-nav" { focus-workspace-up; }
    TouchpadSwipe3Down  tag="ws-nav" { focus-workspace-down; }
    TouchpadSwipe3Left  tag="col-nav" { focus-column-right; }
    TouchpadSwipe3Right tag="col-nav" { focus-column-left; }
    TouchpadSwipe4Up    { toggle-overview; }
    TouchpadSwipe4Down  { toggle-overview; }

    // Mod+Touchpad gestures (test — hold Super + swipe triggers notification)
    Mod+TouchpadScrollUp { spawn "notify-send" "Scroll" "2-finger Up"; }
    Mod+TouchpadScrollDown { spawn "notify-send" "Scroll" "2-finger Down"; }
    Mod+TouchpadScrollLeft { spawn "notify-send" "Scroll" "2-finger Left"; }
    Mod+TouchpadScrollRight { spawn "notify-send" "Scroll" "2-finger Right"; }
    Mod+TouchpadSwipe3Up { spawn "notify-send" "Swipe" "3-finger Up"; }
    Mod+TouchpadSwipe3Down { spawn "notify-send" "Swipe" "3-finger Down"; }
    Mod+TouchpadSwipe3Left { spawn "notify-send" "Swipe" "3-finger Left"; }
    Mod+TouchpadSwipe3Right { spawn "notify-send" "Swipe" "3-finger Right"; }
    Mod+TouchpadSwipe4Up { spawn "notify-send" "Swipe" "4-finger Up"; }
    Mod+TouchpadSwipe4Down { spawn "notify-send" "Swipe" "4-finger Down"; }
    Mod+TouchpadSwipe4Left { spawn "notify-send" "Swipe" "4-finger Left"; }
    Mod+TouchpadSwipe4Right { spawn "notify-send" "Swipe" "4-finger Right"; }
    Mod+TouchpadSwipe5Up { spawn "notify-send" "Swipe" "5-finger Up"; }
    Mod+TouchpadSwipe5Down { spawn "notify-send" "Swipe" "5-finger Down"; }
    Mod+TouchpadSwipe5Left { spawn "notify-send" "Swipe" "5-finger Left"; }
    Mod+TouchpadSwipe5Right { spawn "notify-send" "Swipe" "5-finger Right"; }
}
niri-tag-sidebar config (sample-config.toml)
# niri-tag-sidebar configuration
#
# Each [[panel]] defines a slide-out drawer bound to a gesture tag.
# The tag must match a `tag="..."` property on a gesture bind in your niri config.
#
# layer options: "overlay" (above waybar), "top" (same as waybar), "bottom", "background"

[[panel]]
tag = "sidebar-left"
edge = "left"
size = 400
snap_threshold = 0.4
bg_color = "rgba(40, 100, 220, 0.95)"
label = "Navigation"
layer = "overlay"

[[panel]]
tag = "sidebar-right"
edge = "right"
size = 400
snap_threshold = 0.5
bg_color = "rgba(220, 50, 50, 0.95)"
label = "Quick Settings"
layer = "overlay"

[[panel]]
tag = "sidebar-top"
edge = "top"
size = 250
snap_threshold = 0.3
bg_color = "rgba(50, 180, 80, 0.95)"
label = "Notifications"
layer = "overlay"

[[panel]]
tag = "sidebar-bottom"
edge = "bottom"
size = 300
snap_threshold = 0.6
bg_color = "rgba(220, 170, 30, 0.95)"
label = "Media Controls"
layer = "overlay"

# Progress bar — shows real-time gesture progress for workspace navigation
# Appears during 3-finger swipe (ws-nav tag) and auto-hides on release
[[panel]]
tag = "ws-nav"
edge = "bottom"
style = "bar"
bar_height = 40
bg_color = "rgba(100, 60, 200, 0.90)"
label = "Workspace"
layer = "overlay"

# Progress bar — shows real-time gesture progress for column navigation
# Appears during 3-finger horizontal swipe (col-nav tag) and auto-hides on release
[[panel]]
tag = "col-nav"
edge = "bottom"
style = "bar"
bar_height = 40
bg_color = "rgba(200, 120, 40, 0.90)"
label = "Column"
layer = "overlay"

What Tags Enable

Slide-out panels, gesture-driven launchers, custom workspace indicators, scrubbing controls (volume/brightness), accessibility tools — anything that wants live gesture progress without modifying the compositor.

@Sempyos
Member

Sempyos commented Apr 10, 2026

Would you be able to provide a demo video of the app in motion..?

@julianjc84
Author

Would you be able to provide a demo video of the app in motion..?

Yes, absolutely, later today I will.

@julianjc84
Author

https://youtu.be/lGB4mEf5zrI

@Atan-D-RP4
Contributor

https://youtu.be/lGB4mEf5zrI

That is awesome!

@julianjc84
Author

https://youtu.be/lGB4mEf5zrI

That is awesome!

Your ideas and guidance on staying with KDL + tags were instrumental in getting to this working proof-of-concept stage, so thank you.

@Atan-D-RP4
Contributor

Well, I'm still not fully happy with the design, because we have a sort of separation between where Niri configures the setup for those binds and where that setup is used, with the action itself implemented by the external app.

I was also thinking of something like having the spawn action behave differently when called from these binds: attaching state to the spawned process via environment variables, which the spawned process can then read and use to drive animations. This way the state stays fully isolated, does not need to be exposed over IPC, and removes the need for tags/tagged events. Though I'm unsure how non-trivial it would be to modify the spawn action in that way.

This is already pretty awesome, but I think we can still do better.

@julianjc84
Author

I updated the Repo: https://github.com/julianjc84/niri-tag-sidebar app to accept touch so it can be closed now for a better UI experience.
video : https://youtu.be/YsTgXJmMI_M

@Sempyos
Member

Sempyos commented Apr 10, 2026

Thanks so much for the video!
It looks very nice and I love it.

Our thoughts (as ones daily-driving a tablet on niri) on this PR:

  1. Renaming the touch block to touchscreen is a breaking change, and should probably be left as touch
  2. For the example gesture block, I would find it helpful if there were descriptions for the rest of the options there (recognition-threshold, edge-threshold, pinch-threshold, pinch-ratio, pinch-sensitivity, and finger-threshold-scale)
  3. (Read above comments) Having a fingers=NUMBER option would probably be preferable
  4. For some reason, when using foot --server and starting a gesture, I am no longer able to scroll or double-click to select text. Unsure as to why, but it fixed itself at one point and then broke again
  5. Mildly annoying not having gestures inhibited by apps, but I had similar issues with lisgd, so it doesn't matter much to me
  6. I feel there should be opt-in 2-finger gesture binds for those who need them, as touchscreen gestures with more than 3 fingers start to become cumbersome
  7. Along with that, edge gestures would be lovely, i.e. swiping in from the top, left, bottom, and right edges, similar to what lisgd and Android have
  8. Mod + 3-finger swipe can rather consistently pick up a window and toggle it to floating, which may be an issue
  9. sensitivity should also be documented with a comment
  10. Pinch gestures for overview and such should also have the smooth animations Touch4SwipeUp has for overview and similar, but mostly overview
  11. An example of scrubbing with brightnessctl or similar would be nice to have

Everything else to me seems to be working fine, I did not test tags as I am still unsure about how to utilize those, but otherwise lovely work !

if you ever want our feedback or testing please feel free to ping

julian and others added 16 commits May 5, 2026 05:09
Write-up of the design choices behind this branch's touchscreen gesture
stack, framed explicitly as a working prototype shaped by reviewer
feedback — not as niri's canonical design. Intended as a concrete
reference point for discussion, not a prescription.

Covers:
  - What Wayland provides (wl_touch raw, wp_pointer_gestures_v1 for
    touchpad) and the touchscreen gap
  - Why touchscreen gesture recognition doesn't live in libinput
    (direct manipulation, ambiguous recipient)
  - How iOS UIKit, Android's systemGestureExclusionRects, Linux phone
    shells, and userspace daemons (TouchEgg/lisgd/InputActions) solve
    or fail to solve the same problem
  - The specific choices this branch makes with labeled rationale:
    compositor-side recognizer, unified binds {} block, tag+IPC events,
    continuous noop, touchscreen-gesture-passthrough window rule,
    Mod+edge escape hatches, per-edge zoned triggers
  - Alternatives considered and rejected (dynamic client dialog,
    allow-forwarding per bind, zone granularity, auto-detection,
    external daemon, global disable toggle) with reasoning
  - Future directions: a minimal Wayland protocol modeled on Android's
    exclusion rects, unifying IPC progress with internal commit
    threshold, touchpad passthrough sibling rule, accurate completed
    flag on GestureEnd
  - Open questions explicitly inviting reviewer pushback

The doc's framing block makes clear this is one possible approach, not
the approach, and that the whole point of writing it down is to make
the rationale arguable rather than buried in code.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add TouchRotate3/4/5 Cw/Ccw triggers alongside the existing swipe and
pinch families (same hardcoded-per-finger-count shape — true arbitrary
N-finger parameterization is still open in TODO §6). Rotation is
detected from the averaged per-finger angle change around the cluster
centroid; cross-finger averaging gives a √N noise-floor improvement
over single-finger angular drift. Rotation classification runs before
pinch and swipe so a clearly twisting cluster wins over incidental
spread or translation.

Tuning lives under input { touchscreen { gestures { } } }:
rotation-threshold (min radians to latch), rotation-ratio (how much
rotation arc length must dominate swipe/spread by), and
rotation-progress-distance (radians that map to IPC progress ±1.0).

While here, refactor GestureProgress's overloaded delta_x / delta_y
fields into a typed GestureDelta enum (Swipe { dx, dy } / Pinch
{ d_spread } / Rotate { d_radians }) so each gesture family can carry
its own natural shape instead of stuffing rotation radians into an
empty y slot.

NOTE: rotation detection is an early PoC and is currently buggy and
intermittent on real hardware — recognition can misfire, lock at the
wrong finger count, or fail to latch. The math, IPC, unit tests, and
bind plumbing are all in place, but real-world tuning and edge cases
(lock grace period, finger-lift rebasing under rotation, noise
thresholds) still need debugging. Parking it here so the rest of the
work lands and the rotation pass can resume later.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Collapse the 40 hardcoded Touch*/Touchpad* enum variants into 5
parameterized struct variants driven by KDL properties, so the
compositor no longer hardcodes the legal finger counts into the trigger
name. User configs go from:

    TouchSwipe3Up        { focus-workspace-up; }
    TouchEdgeTop:Left    { spawn "..."; }
    TouchpadSwipe4Left   { ... }

to the property form:

    TouchSwipe    fingers=3 direction="up"            { focus-workspace-up; }
    TouchEdge     edge="top" zone="left"              { spawn "..."; }
    TouchpadSwipe fingers=4 direction="left"          { ... }

Each family keeps its own knuffel-validated property set, so invalid
combinations fail at parse time:
- TouchSwipe / TouchpadSwipe: fingers=N (3..=10), direction="up|down|left|right"
- TouchPinch:  fingers=N (3..=10), direction="in|out"
- TouchRotate: fingers=N (3..=10), direction="cw|ccw"
- TouchEdge:   edge="left|right|top|bottom", optional zone=
  (zone vocab rotates per edge axis: top/bottom edges take
   left|center|right; left/right edges take top|center|bottom).

Why:
- Arbitrary finger counts (up to 10) are now supported with zero enum
  churn. Touchscreens and large trackpads that report 6+ contacts are
  no longer capped at 5.
- Syntax matches the rest of niri's KDL config, which already uses
  property-based attributes everywhere (`tag=`, `natural-scroll=`,
  `cooldown-ms=`, `allow-when-locked=`). Touch triggers were the one
  family that hardcoded structural parameters into the node name.
- The old "can't distinguish finger counts at bind lookup" objection
  (from an earlier half-design where the count lived on `Bind` instead
  of inside `Trigger`) is moot: putting `fingers` inside the `Trigger`
  struct variant keeps `Eq`/`Hash`/bind lookup unchanged.
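In rough shape, the struct-variant refactor looks like the sketch below. Field and helper names are illustrative renderings of the commit's description, not the exact definitions in niri-config/src/binds.rs:

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
enum SwipeDirection { Up, Down, Left, Right }

// Parameters live inside the variant, so Eq/Hash derive and bind lookup
// stays a plain map lookup -- the point made in the last bullet above.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
enum Trigger {
    TouchSwipe { fingers: u8, direction: SwipeDirection },
    TouchpadSwipe { fingers: u8, direction: SwipeDirection },
    // TouchPinch / TouchRotate / TouchEdge follow the same pattern.
}

const MIN_FINGERS: u8 = 3;
const MAX_FINGERS: u8 = 10;

/// Per-family validation at parse time, as in the commit's builder:
/// invalid combinations fail before a bind is ever constructed.
fn build_swipe(fingers: u8, direction: SwipeDirection) -> Result<Trigger, String> {
    if !(MIN_FINGERS..=MAX_FINGERS).contains(&fingers) {
        return Err(format!("fingers must be {MIN_FINGERS}..={MAX_FINGERS}"));
    }
    Ok(Trigger::TouchSwipe { fingers, direction })
}
```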

Hard break — no legacy compat:
- Old `TouchSwipe3Up` / `TouchEdgeTop:Left` / `TouchpadSwipe4Left` etc.
  no longer parse. Users get a clear parse error pointing at the new
  property form.
- IPC `GestureBegin.trigger` string wire format also changes: events
  now echo the same property shape that users write in binds, e.g.
  `TouchSwipe fingers=3 direction="up"`. Consumers doing string-match
  on trigger names must adapt.

Implementation:

- `niri-config/src/binds.rs`: Trigger enum refactor with 5 struct
  variants, new `SwipeDirection` / `PinchDirection` / `RotateDirection`
  helper enums, shared `MIN_FINGERS` / `MAX_FINGERS` constants (3..=10),
  private `GestureTriggerProps` + `build_gesture_trigger()` builder
  with per-family validation, `Bind::decode_node` routes gesture-family
  node names to property-based parsing while keyboard / mouse / wheel /
  TouchpadScroll keep the FromStr fast path.
- `niri-config/src/input.rs`: `ScreenEdge::as_kdl_name()` method and
  module-level `zone_kdl_name(edge, zone)` helper as the single source
  of truth for the axis-rotating zone vocabulary. Parser round-trips
  through `zone_kdl_name` instead of hand-rolling the inverse mapping.
- `src/input/touch_gesture.rs`: recognition constructs the new struct
  variants directly from (gesture kind, finger count); `MIN_FINGERS` /
  `MAX_FINGERS` imported from niri-config so there's one range source.
  `trigger_to_ipc_name` takes `Trigger` directly (was `Option<Trigger>`),
  generates the property-form string dynamically, uses `zone_kdl_name`
  and `ScreenEdge::as_kdl_name`, and `debug_assert!`s loudly if it's
  ever called with a non-gesture trigger. Trivial `edge_to_trigger` /
  `edge_zone_to_trigger` wrappers inlined at their two call sites.
- `src/input/mod.rs`: `swipe_trigger()` accepts any N in
  `MIN_FINGERS..=MAX_FINGERS`, not just 3/4/5. The duplicate
  `touchpad_trigger_ipc_name` is deleted; its caller routes through
  `touch_gesture::trigger_to_ipc_name` (which already handled
  `TouchpadSwipe`).
- `src/ui/hotkey_overlay.rs`: display labels generated from fingers +
  direction via helper fns, with `format_touch_edge_label` routing
  through the shared `zone_kdl_name` / `as_kdl_name` helpers.
- `niri-ipc/src/lib.rs`: `GestureBegin.trigger` docstring updated with
  examples of every new family string form.

Note on duplicate properties: knuffel 3.2 stores `SpannedNode::properties`
as `BTreeMap<SpannedName, Value>` and silently takes the last value on
name collision (see knuffel-3.2.0/src/grammar.rs:566). So
`TouchSwipe fingers=3 fingers=5 direction="up"` resolves to `fingers=5`
with no error. This is knuffel-spec behavior, not something niri can
intercept without a separate pre-parse pass over raw KDL source. The
observed last-wins behavior is documented in a regression test.

Tests:
- 15 unit tests in `niri-config/src/binds.rs` exercise
  `build_gesture_trigger` directly: arbitrary fingers=3..=10, per-family
  direction validation, edge zone vocab rejection, fingers out of range.
- 16 integration tests go through `Config::parse_mem`, covering the
  full Bind::decode_node two-phase parse path: modifier stripping
  (`Mod+TouchSwipe ...`, multi-modifier `Mod+Shift+TouchEdge ...`),
  tag on gesture vs keyboard-bind-rejection, gesture property on
  keyboard bind rejection, unknown property rejection, TouchEdge with
  zone and vocab mismatch rejection, TouchpadSwipe with modifier,
  TouchRotate, TouchPinch direction="in" and "out", fingers-out-of-range,
  cross-family direction rejection (swipe rejecting "cw"), and the
  knuffel duplicate-property last-wins behavior.
- Full niri test suite: 198 pass.

Docs:
- `docs/wiki/Gestures.md`: rewritten to document the four node families
  with property grammar and examples, including a rotation section
  that warns about the early-PoC state of rotation recognition.
- `docs/wiki/Configuration:-Input.md`: touchscreen / touchpad tuning
  subblocks updated with the new bind form.
- `docs/wiki/Design:-Touchscreen-Gestures.md` §5.2: rewritten. The
  prior rationale explicitly argued *against* `fingers=N` and for
  hardcoded node names; that position is now reversed, with the new
  section explaining what changed and why.
- `resources/default-config.kdl`: user-facing commented examples
  migrated to the new syntax so first-install users don't copy broken
  snippets.

Pre-existing unrelated issue not touched by this commit: there is a
stale `tests::parse` insta snapshot in `niri-config/src/lib.rs` from
earlier commits that added tag / sensitivity / natural-scroll fields to
`Bind` without refreshing the snapshot. Needs a separate
`cargo insta review` pass.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Debug-log driven investigation of 5-finger rotation reliability (details
in ~/aaaLOG_Niri analyzer output) surfaced three independent bugs that
together kept drift rotations at 0% and clean rotations at ~80%.

Bug 1 - finger-up spread basis discontinuity
  src/input/touch_gesture.rs on_touch_up
  The rotation basis was rebased on finger-lift but the spread basis
  was not. When a user lifted a finger during unlocked recognition,
  current_spread jumped (geometry change, not finger motion) and
  spread_change = (current - initial).abs() spiked past pinch_threshold
  on the very next frame, latching a spurious PinchIn. Observed as
  finger-lift retries of 5-finger rotations turning into unwanted
  PinchIn at fingers=4.
  Fix: during unlocked recognition with 3+ fingers still down, rebase
  initial_spread to the post-removal geometry so spread_change resets
  to zero across the discontinuity. The locked-pinch rebase already
  handles the other case.
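A minimal sketch of the Bug 1 fix, with illustrative names (`spread` here is the mean distance from the centroid; the real state machine tracks more):

```rust
/// Mean distance of each contact from the cluster centroid.
fn spread(pts: &[(f64, f64)]) -> f64 {
    let n = pts.len() as f64;
    let (sx, sy) = pts.iter().fold((0.0, 0.0), |(ax, ay), &(x, y)| (ax + x, ay + y));
    let (cx, cy) = (sx / n, sy / n);
    pts.iter().map(|&(x, y)| (x - cx).hypot(y - cy)).sum::<f64>() / n
}

/// On finger lift during unlocked recognition with 3+ fingers still down,
/// rebase the spread basis to the post-removal geometry so spread_change
/// resets to zero across the discontinuity instead of spiking past the
/// pinch threshold on the next frame.
fn on_touch_up(initial_spread: &mut f64, locked: bool, remaining: &[(f64, f64)]) {
    if !locked && remaining.len() >= 3 {
        *initial_spread = spread(remaining);
    }
}
```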

Bug 2 - swipe LOCK has no dominance check
  src/input/touch_gesture.rs FRAME classification
  is_rotate required rotation_arc to dominate swipe_distance by the
  rotation_ratio, but the swipe LOCK branch was a naked threshold
  crossing (swipe_distance >= threshold). Result: any hand drift over
  the swipe threshold (~110 px for 5 fingers) instantly latched as a
  swipe, even when the user was visibly rotating with rot > 40 deg and
  arc > 250 px. The thresholds raced and swipe always won because its
  check was strictly weaker.
  Fix: gate the swipe LOCK branch behind !rotation_candidate, where
  rotation_candidate = (finger_count >= 3 && rotation_arc >=
  rotation_arc_threshold). Once rotation arc is at candidate quality,
  swipe cannot steal via the race - it only latches when rotation is
  not a plausible classification.

Bug 3 - rotation_ratio default inverted vs intent
  niri-config/src/input.rs rotation_ratio
  Default was 0.5, and the is_rotate check does rotation_arc >=
  swipe_distance * (1.0 / rotation_ratio), i.e. arc >= 2 * swipe. For
  a user rotating at 5 fingers with natural hand drift, this gate is
  essentially unreachable - arc is bounded by finger reach, swipe is
  not. Meanwhile pinch_ratio=2.0 is used directly without inversion,
  so the two ratios have opposite semantics for the same field name
  (higher=stricter for pinch, higher=lenient for rotate).
  Fix: bump default from 0.5 to 2.0, giving an effective gate of
  arc >= swipe * 0.5 (rotation arc only needs to be half the swipe
  distance). Doc comment rewritten to reflect the leniency semantics.
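The fixed gates from Bugs 2 and 3 can be sketched as free functions, using the formulas quoted in the commit message (names and exact shapes are illustrative):

```rust
/// Bug 2 fix: rotation is "candidate quality" once the averaged arc
/// clears its threshold with 3+ fingers down.
fn rotation_candidate(finger_count: usize, rotation_arc: f64, arc_threshold: f64) -> bool {
    finger_count >= 3 && rotation_arc >= arc_threshold
}

/// While rotation is a candidate, the swipe LOCK branch must not fire on
/// a bare threshold crossing -- swipe can no longer win the race by
/// having the strictly weaker check.
fn swipe_locks(swipe_distance: f64, swipe_threshold: f64, rotation_candidate: bool) -> bool {
    swipe_distance >= swipe_threshold && !rotation_candidate
}

/// Bug 3 fix: with rotation_ratio bumped to 2.0 the gate becomes
/// arc >= swipe * 0.5, i.e. the rotation arc only needs to be half the
/// incidental swipe distance (higher ratio = more lenient here).
fn is_rotate(rotation_arc: f64, swipe_distance: f64, rotation_ratio: f64) -> bool {
    rotation_arc >= swipe_distance * (1.0 / rotation_ratio)
}
```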

Config UX - angle values now in degrees
  niri-config/src/input.rs rotation_threshold, rotation_progress_distance
  rotation-threshold was a raw radian value (default 0.2618) and
  rotation-progress-distance was another (default 1.5708 = pi/2).
  Users reading the KDL had to convert degrees to radians in their
  head. Switched both to degrees in the config surface - accessors
  convert to radians internally, callers see no change. Defaults are
  now 15.0 and 90.0. Docs updated.

Telemetry bars in debug logs
  No new code. The existing TOUCH-DBG FRAME debug_log site at line
  ~819 already emits per-frame swipe/spread/rot/arc/is_rotate/closest
  values which is what the investigation was based on - leaving it as
  is so future debug cycles can reuse the journalctl -> analyze.py
  workflow.

Results on the four-scenario benchmark (5 finger clean/drift/retry):
  baseline: 47% (9/19)  - 5 stolen by swipe, 2 false PinchIn, 1 swipe-steal
  t3 (bug1+bug2): 57%   - drift 0% -> 56%, retry correctly no-motion
  t4 (+ratio fix): 75%  - all remaining failures are user under-rotation
                          (rot < 15 deg threshold) or extreme drift
                          (arc/swipe < 0.5)
Adds Event::RecognitionFrame (debug-only IPC, gated on debug_assertions)
so external tools like niri-gesture-inspector can visualize the
touchscreen gesture classifier state per frame. Also lands an overhaul
of the classifier itself.

Knob rename for clarity:

- All touchscreen + touchpad gesture knobs renamed to a unified
  vocabulary: `*-trigger-distance` (commit gates), `*-progress-distance`
  (IPC scaling), `*-dominance-ratio` (race tuning, "higher = stricter"
  for both pinch and rotation), `*-multi-finger-scale` (per-finger
  multipliers). Same knob names span both touchscreen and touchpad
  blocks; defaults differ because units differ (px vs libinput delta).
- IPC RecognitionFrame + GestureBegin field names match the new
  knob names.
- Doc + default-config.kdl examples updated to match.

Pinch commit-gate bugs fixed:

- The commit gate `(is_pinch && spread_change >= swipe_trigger)` was
  double-gating against the SCALED swipe trigger, so 4/5-finger pinches
  inherited a wildly inflated commit threshold (e.g. 234/378 px at
  swipe-multi-finger-scale 2.6) even though pinch-trigger-distance
  stayed flat. Pinch now commits on `is_pinch` alone — pinch-trigger
  is the single threshold for the pinch path.
- The pinch dominance check had a hardcoded ratio-1.0 fallback ORed
  into the primary gate, capping the effective dominance ratio at
  min(knob, 1.0). Any setting > 1.0 was silently masked. Removed the
  fallback so pinch-dominance-ratio is now a real knob across its full
  range (lower = lenient, higher = stricter).
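The corrected pinch commit gate reduces to the sketch below, with illustrative names (dominance semantics as described above: lower = lenient, higher = stricter; no fallback caps the ratio at 1.0):

```rust
/// Sketch: pinch commits on its own threshold alone -- no longer
/// double-gated against the finger-scaled swipe trigger -- and the
/// dominance ratio is applied across its full range.
fn pinch_commits(
    spread_change: f64,
    swipe_distance: f64,
    pinch_trigger_distance: f64,
    dominance_ratio: f64,
) -> bool {
    spread_change >= pinch_trigger_distance
        && spread_change >= swipe_distance * dominance_ratio
}
```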

Default tuning (touchscreen):

- swipe-trigger-distance:    16  → 100
- pinch-trigger-distance:    30  → 100
- pinch-dominance-ratio:     2.0 → 1.0
- swipe-multi-finger-scale:  1.5 → 1.2  (small pinch-priority bias at
                                         4/5-finger so ambiguous motions
                                         resolve as pinch over swipe)
- edge-start-distance:       20  → 30
- rotation-trigger-angle:    15° → 20°

GestureBegin/End emit unconditionally for multi-finger commits:

Multi-finger touchscreen gesture commits now emit GestureBegin/GestureEnd
unconditionally, with an empty `tag` for binds that have no user-set
tag. Previously these events were gated behind `tag.is_some()`, so
debug tools (niri-gesture-inspector) only saw the lock for binds the
user happened to have tagged — making "last lock" stick on the most
recent tagged swipe forever, even when the user was actually
pinching/rotating untagged binds. External consumers filter on tags
they care about, so empty-tag events are a no-op for them. Edge swipes
and touchpad gestures still emit only for tagged binds.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add TouchTap { fingers } trigger for binding stationary N-finger taps
(3-10 fingers). Tap detection runs in parallel with swipe/pinch/rotate
recognition using a spatial dead zone — matching the approach used by
Android, iOS, libinput, and Windows.

Tap candidate starts when 3+ fingers land, is killed if any finger
drifts beyond tap-wobble-threshold (default 15px) or when the
recognizer locks a motion gesture, and fires when all fingers lift
within tap-timeout-ms (default 250ms). No latency added to swipe
recognition.
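The dead-zone check can be sketched as below; these are hypothetical shapes using the defaults above, and the real state machine lives in src/input/touch_gesture.rs:

```rust
use std::time::{Duration, Instant};

// Illustrative tap-candidate state: when it started, and where each
// finger originally landed.
struct TapCandidate {
    started: Instant,
    origins: Vec<(f64, f64)>,
}

/// Candidate survives only while every finger stays within the wobble
/// threshold (default 15 px) of its touchdown position.
fn within_dead_zone(c: &TapCandidate, current: &[(f64, f64)], wobble_px: f64) -> bool {
    c.origins
        .iter()
        .zip(current)
        .all(|(&(ox, oy), &(x, y))| (x - ox).hypot(y - oy) <= wobble_px)
}

/// Fires only if all fingers lift inside the timeout (default 250 ms).
fn fires(c: &TapCandidate, lifted_at: Instant, timeout: Duration) -> bool {
    lifted_at.duration_since(c.started) <= timeout
}
```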

Config: `tap-wobble-threshold` and `tap-timeout-ms` in the
`touchscreen { gestures { } }` block.

Also fixes a pre-existing bug where `touchscreen_gesture_passthrough`
was not cleared in `on_touch_cancel`, which could permanently suppress
gesture recognition after a cancel event on a passthrough window.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Defer gesture recognition and continuous animation feeding from
per-slot on_touch_motion to on_touch_frame. libinput delivers separate
TouchMotion events per finger per hardware scan — previously each one
triggered spread/rotation recalculation, animation feeding, and
queue_redraw_all(). With 3 fingers at 120Hz this caused 360 gesture
updates/sec instead of 120.

Now on_touch_motion only updates positions and accumulates deltas.
on_touch_frame runs recognition + feed once per scan with all
positions final — more efficient and more accurate (spread/rotation
computed on complete geometry instead of partially-updated state).
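The motion/frame split reduces to a record-then-process pattern, sketched here with illustrative names:

```rust
use std::collections::HashMap;

#[derive(Default)]
struct TouchState {
    positions: HashMap<i32, (f64, f64)>, // slot -> latest position
    dirty: bool,
    recognitions: u32, // stand-in for "recognition + feed + redraw" work
}

impl TouchState {
    /// Per-slot motion: just record the position and mark dirty.
    fn on_touch_motion(&mut self, slot: i32, pos: (f64, f64)) {
        self.positions.insert(slot, pos);
        self.dirty = true;
    }

    /// Per-scan frame: run recognition once, on complete geometry.
    fn on_touch_frame(&mut self) {
        if self.dirty {
            self.recognitions += 1;
            self.dirty = false;
        }
    }
}
```

Three per-finger motion events followed by one frame event now cost one recognition pass instead of three.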

Also adds TouchTap documentation to the wiki Gestures page.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…TouchpadTapHoldDrag)

Add two new touchpad gesture triggers using libinput's hold gesture API:

- TouchpadTapHold: fires on release after stationary hold (3+ fingers).
  Fast taps pass through to clients (e.g. terminal 3-finger paste).
- TouchpadTapHoldDrag: fires when held fingers start moving, reusing
  the existing swipe bind infrastructure for continuous animations.
  Distinguishes from direct swipes via the hold-before-move pause.

Also fixes the pre-existing tests::parse snapshot failure (touch →
touchscreen rename, missing fields on Bind and WindowRule structs).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add TouchTapHoldDrag trigger for touchscreen — fires when fingers hold
stationary (tap candidate alive) then start moving past wobble threshold.
Distinguishes from direct swipes via tap-hold-trigger-delay-ms (default
150ms): fast swipes bypass hold-drag and enter normal recognition.

Supports optional direction= property for per-direction binds after hold
(directional checked first, omnidirectional fallback). Can drive continuous
actions via existing ActiveTouchBind::Swipe infrastructure.

New config knob: tap-hold-trigger-delay-ms in touchscreen gestures block.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- edge-start-distance: 30 → 12 (narrower edge zone feels more natural)
- tap-timeout-ms: 250 → 500 (allows slightly deliberate taps)
- tap-hold-trigger-delay-ms: 150 → 200 (reduces fast-swipe false positives)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add the Gesture IPC Refactor RFC to the wiki — design plan for replacing
the tag system with env-var spawn context, stdin progress pipes, a public
IPC event stream, noop=consume semantics, and per-window binds {} in
window-rules for fingers=1/2 disambiguation. Sourced from PR niri-wm#3771 review
discussion (Atan-D-RP4 originated the core ideas).

Also convert the Status banner on Design:-Touchscreen-Gestures.md from a
bold-prefixed quote to a [!IMPORTANT] admonition, per the docs policy in
Development:-Documenting-niri.md.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Per Development:-Developing-niri.md, debug! is for normal-operation
messages whose absence shouldn't hurt UX, while trace! is for spammy /
performance-intensive diagnostics — and trace! is compiled out of
release builds.

TOUCH-DBG FRAME fires every motion event during gesture recognition
(multiple times per frame, every gesture). That's textbook trace! —
useful when actively tuning the recognizer, but not something a release
build should pay for.

The other TOUCH-DBG tags (FINGER-LAND, UNLOCK, LOCK, TAP) stay at debug!
since they fire at gesture lifecycle events, which are rare and fit the
"normal operation" definition.

To see FRAME logs in dev builds, use RUST_LOG=niri=trace.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…lots

Multi-finger gesture onset forwards wl_touch.down for the first two
fingers before the 3rd finger flips the gesture gate. The matching up
events are then suppressed by that same gate — leaving clients with
two phantom touch slots, observed as GTK/Qt dialog buttons ignoring
single-finger taps until a two-finger tap happens to reuse those slot
IDs and drain the sequences.

Track forwarded slots in `Niri::touch_forwarded_slots` and at the
transition into gesture tracking emit explicit handle.up for each
previously-forwarded slot plus handle.cancel + handle.frame so the
client can't hold them as phantoms. Clear on all-fingers-up and on
TouchCancel.

The existing handle.cancel at the swipe/pinch/rotate LOCK path still
covers the case where LOCK fires during motion without a new finger
DOWN, but was insufficient for the common 3rd-finger transition path.

Also ships compositor-side TOUCH-DBG FORWARD / BLOCKED / CANCEL-CLIENT
tracing that was essential to diagnose this — the prior assumption
that touch_gesture_locked was sticky turned out wrong, but the logs
immediately revealed the real "first two forwarded, rest blocked"
pattern.

Addresses reviewer report that foot --server scroll/select breaks
after a touch gesture ("self-healed and then re-broke") — same root
cause.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Pure formatting changes (whitespace, line wrapping). No logic changes.
Resolves rustfmt CI failure.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Adds a `TouchpadPinch { fingers, direction }` trigger alongside the
existing touchpad gestures. libinput emits pinch events natively for
2/3/4 fingers on touchpads, so the parser accepts fingers=2 here (vs
3+ for touchscreen which reserves 2-finger for client passthrough);
MIN_FINGERS gating is now per-family.

Classifier hooks into on_gesture_pinch_begin/update/end and fires the
bind once per gesture when `|scale - 1.0|` crosses a configurable
threshold. Scale-ratio units (not pixels) since libinput pre-normalizes
touchpad pinch data — new pinch-trigger-scale knob lives under
input.touchpad.gestures alongside swipe-trigger-distance, default 0.15.
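The scale-ratio trigger reduces to a small check, sketched below with illustrative names (the real classifier also handles the begin/update/end plumbing and fires at most once per gesture):

```rust
#[derive(Debug, PartialEq)]
enum PinchDirection { In, Out }

/// libinput reports `scale` relative to 1.0 at gesture begin, so the
/// threshold is a ratio (default pinch-trigger-scale 0.15), not pixels.
fn pinch_trigger(scale: f64, trigger_scale: f64) -> Option<PinchDirection> {
    if (scale - 1.0).abs() < trigger_scale {
        return None; // not yet past the threshold
    }
    Some(if scale < 1.0 { PinchDirection::In } else { PinchDirection::Out })
}
```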

Raw pinch events still forward to Wayland clients so app-side
pinch-to-zoom keeps working in parallel with the compositor bind.
Tagged binds emit discrete IPC GestureBegin/End events matching the
TouchpadTapHold pattern.

KDL syntax: `TouchpadPinch fingers=2 direction="in"`.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@julianjc84 julianjc84 force-pushed the feat/configurable-touch-gestures branch from 9657614 to 9695817 Compare May 4, 2026 21:20
@wdanilo

wdanilo commented May 7, 2026

This is beyond amazing! Will this allow me to change gesture of scrolling desktops from 3 fingers to 4 fingers? :)

@julianjc84
Author

This is beyond amazing! Will this allow me to change gesture of scrolling desktops from 3 fingers to 4 fingers? :)

Yes, it is designed to work with both the trackpad and the touchscreen, binding N fingers to any action.

@Atan-D-RP4
Contributor

Have you decided on a direction with replacing/removing the tags stuff? And reconciling how to decide whether events should be forwarded to clients or not?
Apologies if these seem like obvious questions. There's a lot going on in this PR. And I haven't been able to test it yet.

@wdanilo

wdanilo commented May 7, 2026

One more question - currently, touchpad config is pretty limited - for example, I wanted to have slower scroll, but increase the acceleration of the scroll when moving my fingers fast, and make the inertia scroll stop more aggressively (to match macOS-like behavior). It's currently impossible. Can such control be added / is it planned to be added? If so, I'd be in heaven :)

@julianjc84
Author

Have you decided on a direction with replacing/removing the tags stuff? And reconciling how to decide whether events should be forwarded to clients or not? Apologies if these seem like obvious questions. There's a lot going on in this PR. And I haven't been able to test it yet.

I have not decided; I've really just been enjoying the Touch* features. This question is bigger than me. I just implemented your idea as a proof of concept with tag, and it works pretty decently. What the best way is to get gesture actions outside of the compositor for 3rd-party apps to consume is an important question, and I'm not sure how to go about it. I was hoping the other 'touchscreen' and 'touchpad' features would draw people in and get some ideas flowing.
The tag system could absolutely be ripped out of this PR and made Touch*-only, with a separate branch for 'tags' exposing gestures.
Open to all suggestions.

@Sempyos
Member

Sempyos commented May 8, 2026

I'll probably provide some feedback soon

@julianjc84
Author

One more question - currently, touchpad config is pretty limited - for example, I wanted to have slower scroll, but increase the acceleration of the scroll when moving my fingers fast and make the inertia scroll stop more aggressively (to match macos like behavior). Its currently impossible. Can such control be added / is planned to be added? If so, I'd be in heaven :)

I have not used a Mac in 10+ years, so I don't really understand the request. I will try to get my hands on one and test.

@Atan-D-RP4
Contributor

One more question - currently, touchpad config is pretty limited - for example, I wanted to have slower scroll, but increase the acceleration of the scroll when moving my fingers fast and make the inertia scroll stop more aggressively (to match macos like behavior). Its currently impossible. Can such control be added / is planned to be added? If so, I'd be in heaven :)

I think this is something that should be added on the libinput side. There's been a similar discussion about it in this thread, #3960.

I wonder if this can be achieved via libinput plugins.

@julianjc84 julianjc84 closed this May 9, 2026
@julianjc84 julianjc84 reopened this May 9, 2026
@julianjc84 julianjc84 closed this May 9, 2026
@julianjc84 julianjc84 reopened this May 9, 2026
@julianjc84
Author

One more question - currently, touchpad config is pretty limited - for example, I wanted to have slower scroll, but increase the acceleration of the scroll when moving my fingers fast and make the inertia scroll stop more aggressively (to match macos like behavior). Its currently impossible. Can such control be added / is planned to be added? If so, I'd be in heaven :)

I did not realise how federated Linux is with input; that would explain the disconnected scrolling feel across apps.

Linux input is a federated pipeline with many ownership boundaries between fingertip and pixel. Each boundary owner makes locally reasonable decisions; the cumulative result is the segmented feel users notice when scrolling in different apps.

macOS centralizes the whole pipeline. AppKit owns scroll simulation, so every app inherits one consistent feel. Apple isn't better at scroll; Apple owns the boundary.

On Linux, post-lift kinetic ("inertia") scrolling lives in the app, not the compositor, and not libinput. You'll notice this everywhere: Firefox coasts, Thunar coasts, xfce4-terminal doesn't coast at all, and where coasting exists the curve differs per app. Each app implements its own kinetic scrolling on top of the raw axis events the compositor forwards. This means a compositor-side change to the active-phase stream (a velocity curve, a scroll multiplier) propagates uniformly to every app, since they all just apply deltaY linearly. But a compositor-side post-lift kinetic engine would double-stack with apps that already simulate their own, producing the wrong feel in Firefox, Thunar, etc.

I don't think you will ever get a macOS scroll feel; it seems impossible.


Labels

area:config Config parsing, default config, new settings area:input Keyboard, mouse, touchpad, tablet, gestures, pointer pr kind:feature New features and functionality


5 participants